Semantic Word Embedding (SWE)

Author

  • Quan Liu
Abstract

Contents of the Semantic Word Embedding (SWE) reference manual (http://home.ustc.edu.cn/~quanliu/):

Chapter 1 Semantic Word Embedding
  1.1 The Skip-gram model
  1.2 SWE as Constrained Optimization
Chapter 2 Ordinal Semantic Constraints
  2.1 Representing Knowledge By Ranking
  2.2 Three Common Sense Rules
Chapter 3 SWE Toolkit
Chapter 4 SWE Applications
  4.1 Word Similarity
    4.1.1 Task description
    4.1.2 SWE Demo Instruction
  4.2 Sentence completion
    4.2.1 Task description
    4.2.2 SWE Demo Instruction
  4.3 Named entity recognition
    4.3.1 Task description
    4.3.2 SWE Demo Instruction
  4.4 Synonym selection
    4.4.1 Task description
    4.4.2 SWE Demo Instruction
5. Final Remarks
References


Similar references

Exploring Semantic Representation in Brain Activity Using Word Embeddings

In this paper, we utilize distributed word representations (i.e., word embeddings) to analyse the representation of semantics in brain activity. The brain activity data were recorded using functional magnetic resonance imaging (fMRI) when subjects were viewing words. First, we analysed the functional selectivity of different cortex areas by calculating the correlations between neural responses ...
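
The correlation analysis this abstract describes can be illustrated with a short, hedged sketch: correlate each voxel's responses across the viewed words with each embedding dimension. The array names and shapes below are assumptions for illustration, not the paper's code.

```python
import numpy as np

def voxel_embedding_correlations(fmri, embeddings):
    """Correlate each voxel's response profile with each embedding dimension.

    fmri       : (n_words, n_voxels) responses recorded while viewing words
    embeddings : (n_words, d) word vectors for the same words
    Returns an (n_voxels, d) matrix of Pearson correlations.
    """
    f = (fmri - fmri.mean(axis=0)) / fmri.std(axis=0)            # z-score over words
    e = (embeddings - embeddings.mean(axis=0)) / embeddings.std(axis=0)
    return f.T @ e / fmri.shape[0]                               # mean of z-score products
```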


Learning Semantic Word Embeddings based on Ordinal Knowledge Constraints

In this paper, we propose a general framework for incorporating semantic knowledge into the popular data-driven learning process of word embeddings to improve their quality. Under this framework, we represent semantic knowledge as many ordinal ranking inequalities and formulate the learning of semantic word embeddings (SWE) as a constrained optimization problem, where the data-derived object...
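
As a rough illustration of the constrained-optimization idea described above, the sketch below adds a hinge penalty for ordinal inequalities sim(i, j) >= sim(i, k) on top of a skip-gram style objective. The function names, margin, and weighting term are assumptions, not the actual SWE toolkit code.

```python
import numpy as np

def ordinal_penalty(embed, triplets, margin=0.0):
    """Hinge loss for ranking constraints sim(i, j) >= sim(i, k) + margin.

    embed    : (V, d) word embedding matrix
    triplets : iterable of (i, j, k) word-index triples, one per inequality
    """
    loss = 0.0
    for i, j, k in triplets:
        sim_ij = embed[i] @ embed[j]
        sim_ik = embed[i] @ embed[k]
        loss += max(0.0, margin + sim_ik - sim_ij)   # positive only when the constraint is violated
    return loss

# Joint objective in the spirit of the abstract (beta is a hypothetical weight):
#   total_loss = skip_gram_loss(embed, corpus) + beta * ordinal_penalty(embed, triplets)
# minimized with SGD so the data-derived objective and the knowledge constraints
# are optimized together.
```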


A Joint Semantic Vector Representation Model for Text Clustering and Classification

Text clustering and classification are two main tasks of text mining. Feature selection plays a key role in the quality of the clustering and classification results. Although word-based features such as term frequency-inverse document frequency (TF-IDF) vectors have been widely used in different applications, their shortcoming in capturing semantic concepts of text motivated researchers to use...
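
For concreteness, the word-based TF-IDF features mentioned above can be built and clustered in a few lines. scikit-learn is shown only as one common choice, not the implementation used in the paper.

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "word embeddings capture semantic concepts",
    "tf-idf weights terms by document frequency",
    "clustering groups similar documents together",
]

tfidf = TfidfVectorizer().fit_transform(docs)              # (n_docs, vocab) sparse matrix
labels = KMeans(n_clusters=2, n_init=10).fit_predict(tfidf)
print(labels)                                              # cluster assignment per document
```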


Unsupervised Learning of Word Semantic Embedding using the Deep Structured Semantic Model

Deep neural network (DNN) based natural language processing models rely on a word embedding matrix to transform raw words into vectors. Recently, a deep structured semantic model (DSSM) has been proposed to project raw text to a continuously-valued vector for Web Search. In this technical report, we propose learning word embedding using DSSM. We show that the DSSM trained on a large body of text ...
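
A very rough, single-layer sketch of the two ingredients contrasted above: an embedding lookup that maps raw words to vectors, and a DSSM-style non-linear projection of a whole text scored by cosine similarity. The vocabulary, layer sizes, and average pooling are assumptions; the real DSSM uses letter-trigram hashing and several layers.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = {"deep": 0, "structured": 1, "semantic": 2, "model": 3}
E = rng.normal(size=(len(vocab), 50))     # word embedding matrix (lookup table)
W = rng.normal(size=(50, 32))             # one DSSM-style projection layer

def text_vector(words):
    """Average the word vectors, then apply a tanh projection."""
    avg = np.mean([E[vocab[w]] for w in words], axis=0)
    return np.tanh(avg @ W)

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print(cosine(text_vector(["deep", "semantic"]), text_vector(["semantic", "model"])))
```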


Combining Word Embedding and Lexical Database for Semantic Relatedness Measurement

While many traditional studies on semantic relatedness utilize lexical databases, such as WordNet or Wiktionary, the recent word embedding learning approaches demonstrate their abilities to capture syntactic and semantic information, and outperform the lexicon-based methods. However, word senses are not disambiguated in the training phase of both Word2Vec and GloVe, two famous word embeddi...
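
One simple way to combine the two signals discussed above is a weighted mix of embedding cosine similarity and a WordNet-based score. The weighting, the path-similarity measure, and the input format below are assumptions for illustration, not the paper's method.

```python
import numpy as np
from nltk.corpus import wordnet as wn   # requires the NLTK WordNet data to be downloaded

def combined_relatedness(w1, w2, vectors, alpha=0.5):
    """vectors: dict mapping word -> embedding (e.g. loaded from Word2Vec/GloVe)."""
    a, b = vectors[w1], vectors[w2]
    emb_sim = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    # Best path similarity over all sense pairs: a crude way to handle the fact
    # that the embeddings themselves are not sense-disambiguated.
    sims = [s1.path_similarity(s2)
            for s1 in wn.synsets(w1) for s2 in wn.synsets(w2)]
    wn_sim = max((s for s in sims if s is not None), default=0.0)

    return alpha * emb_sim + (1 - alpha) * wn_sim
```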



Journal:

Volume   Issue

Pages  -

Publication year: 2015